Lov&Data

4/2023: Intellectual property law
20/12/2023

2021 – a face odyssey?

By Milton Rehnlund Ingblad, Associate at the law firm Bird & Bird in Stockholm, specialising in intellectual property law and dispute resolution.

The text "Bird & Bird" in blue, with large initials, on a white background, digital illustration.
A screen scanning a face, digital illustration.

Illustration: Colourbox.com

“Computers will overtake humans with AI at some point within the next 100 years. When that happens, we need to make sure the computers have goals aligned with ours” – Stephen Hawking.

The world of AI has evolved at lightning speed since its beginnings in the 1950s. One of the systems that utilizes AI is facial recognition technology (FRT). FRT works by the AI creating a blueprint of your face based on a number of parameters, for example the distance between the eyes or the shape of the chin. One of its applications is in law enforcement. The faceprint created from images is searched against a database of faces, not unlike a fingerprint. The video or image material can originate from almost anywhere, from surveillance cameras to a private video taken on a smartphone. The technology is increasingly being used by the police, and in some parts of the world they have even started using real-time FRT, where an identification is made the instant your face is captured.
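
To make the matching step above concrete, the following is a minimal, illustrative sketch in Python. It assumes that the faceprint has already been extracted from an image as a numerical feature vector by some face-recognition model (not shown here); the vector size, function names and decision threshold are hypothetical choices for illustration only.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two faceprint vectors; 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, database: dict, threshold: float = 0.6):
    # Search the probe faceprint against a database of known faceprints,
    # much like matching a fingerprint. Returns the best-matching identity,
    # or None if no candidate clears the (hypothetical) threshold.
    best_id, best_score = None, threshold
    for identity, reference in database.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Example with random vectors standing in for real faceprints.
rng = np.random.default_rng(0)
db = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = db["person_a"] + rng.normal(scale=0.05, size=128)  # a noisy new capture
print(identify(probe, db))  # expected to print "person_a"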

However, the use of FRT by law enforcement also carries considerable risks. This powerful tool in the hands of the police could entail a significant infringement of fundamental rights. In light of this, the European Commission put forward a proposal in 2021 for an act to regulate the use of AI: COM(2021) 206 final, Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts, better known as the AI Act. As technology evolves at a rapid pace, the law must follow. The proposal plays a big part in the Commission's greater ambition for a digitalized Union that at the same time considers the ethical implications of AI.

Moreover, it could come to set a global standard, just as the General Data Protection Regulation (GDPR) did, determining to what extent AI has a positive rather than a negative effect. It cannot be denied that the AI Act will have a great impact on all areas of society, especially on the police and their ability to use FRT for law enforcement purposes. The most significant change is, of course, the outright ban on the use of real-time FRT, except in a few limited situations. Police authorities believe that the Act will constitute a major hindrance to their work: while criminals use the same technology, the police are left behind, unable to catch up. Others, by contrast, believe that the Act will not limit the use of FRT nearly as much as it might appear at first sight.

How will the AI Act affect current law and the police's ability to use facial recognition for law enforcement purposes?

The processing of personal data by public authorities for law enforcement purposes is currently regulated by Directive (EU) 2016/680, the Law Enforcement Directive (LED). Its principles largely correspond to those in Article 5 of the GDPR, with some minor changes to adapt them to processing by law enforcement authorities. In Guidelines 05/2022, the European Data Protection Board has also clarified how law enforcement authorities should use FRT and how to ensure that the processing of personal data is lawful under the LED.

Under the AI Act, the use of FRT will be strictly regulated. Certain uses of AI systems, such as real-time remote biometric identification in public places for law enforcement purposes, will be prohibited unless specific conditions are met. These conditions cover the targeted search for specific victims of crime, including missing children; the prevention of a specific, substantial and imminent threat to life or physical safety; and the detection, location, identification or prosecution of perpetrators of serious offenses.

The Act also defines high-risk AI systems, which are allowed to operate only if they meet mandatory requirements and undergo a prior conformity assessment. The requirements include the implementation of a risk management system; data accuracy, relevance and representativeness; clear technical documentation; transparency; user access to processed information; human oversight; and a robust level of accuracy and cybersecurity. The Act also sets out the responsibilities of AI system providers and the procedures for third-party conformity assessments.

The proposed AI Act poses certain challenges for law enforcement authorities seeking to use FRT. While the Act clearly prohibits the use of real-time remote biometric identification in public spaces for law enforcement, it lacks clarity on other aspects of FRT use. This ambiguity could make it difficult for police authorities in the Union to determine the appropriate and legal use of FRT.

Despite these uncertainties, it’s likely that the Act, in its current form, won’t cause significant issues for law enforcement authorities. The real concern lies with those who hoped the Act would restrict the use of real-time FRT by police, as the Act’s exceptions allow for a broad range of FRT uses that seem to contradict the Act’s purpose.

In essence, these exceptions act more as the main rule than the prohibition itself, which seems contradictory if the Act’s purpose is to limit the use of real-time FRT to a few specific objectives. This contradiction is especially notable considering that the Law Enforcement Directive (LED) already seems to prohibit more uses of real-time FRT by law enforcement than the AI Act would.

Regarding the interplay with current law, the AI Act would replace the LED in the context of real-time FRT systems. For high-risk systems, there would be an interaction between the two instruments: a use that is allowed under the AI Act might not be in accordance with the LED, and vice versa.

Potential problems with the Act

The AI Act presents several significant issues.

First, the Act’s ambiguities make it difficult to interpret and could lead to future problems. The wording of the exceptions to the Act’s rules seems to contradict its purpose, and it’s unclear which uses fall under these exceptions. Moreover, the use of post-FRT (facial recognition not conducted in real-time) by law enforcement appears to be left unregulated, which calls for further clarification.

Second, while the Act is designed to be future-proof, its mechanism for updating the list of high-risk systems seems inadequate. Systems introduced after the Act that do not fall into the described uses may be left outside its scope, creating a loophole for potentially risky general-purpose systems to be unregulated.

Third, the AI Act’s interplay with the Prüm II proposal needs to be more carefully considered. The use of FRT allowed under the Act could lay the groundwork for the automated data exchange that Prüm II proposes, and this intersection must be fully explored.

Solutions to the problems

There are some solutions to the aforementioned problems. For the prohibited uses of FRT, the exceptions should be applicable only in specific cases and to a much smaller extent than they appear to be today. This would ensure that real-time FRT is used only when absolutely necessary, minimizing infringements of fundamental rights.

Regarding the high-risk provisions, essentially three amendments need to be made. First, it should be clarified which systems are regarded as high-risk. Second, the use of post-FRT by law enforcement should be added, to make sure it lives up to the requirements set out in the Act. Lastly, it needs to be made easier to add systems to the list found in Annex III, to ensure an act that will stand the test of time.

Another solution is to create a separate AI act for law enforcement, similar to the relationship between the GDPR and the LED, which would ensure that the distinct balancing of rights entailed by law enforcement use of FRT is properly recognized.

The AI Act is hopefully a sign of great things to come and a step in the right direction for creating a Union fit for the digital age. However, it leaves much to be desired, and it is evident that the face odyssey has not quite reached its final destination just yet.

Milton Rehnlund Ingblad